HDNN: a cross-platform MLIR dialect for deep neural networks

Authors

Abstract

This paper presents HDNN, a proof-of-concept MLIR dialect for cross-platform computing specialized in deep neural networks. As target devices, HDNN supports CPUs, GPUs, and TPUs. In this paper, we provide a comprehensive description of the dialect, outlining how the novel approach aims to solve the $$P^3$$ problem of parallel programming (portability, productivity, performance). An HDNN program is device-agnostic, i.e., only the device specifier has to be changed to run a given workload on one device or another. Moreover, HDNN has been designed as a domain-specific language, which ultimately helps productivity. Finally, HDNN relies on optimized libraries for heavy, performance-critical workloads. HDNN has been evaluated against other state-of-the-art machine learning frameworks on all supported hardware platforms, achieving excellent performance. We conclude that the ideas and concepts used in HDNN can be crucial for designing the future generation of compilers and languages to overcome the challenges of the forthcoming heterogeneous era.
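The abstract's central claim, that only the device specifier changes when retargeting a workload, can be illustrated with a hypothetical MLIR-style sketch. Note that the op names (`hdnn.matmul`, `hdnn.relu`) and the `hdnn.device` attribute below are invented for illustration; the actual HDNN syntax is not shown in this abstract.

```mlir
// Hypothetical sketch, not actual HDNN syntax: op names and the
// device attribute are assumptions made for illustration only.
func.func @forward(%a: tensor<128x256xf32>, %b: tensor<256x64xf32>)
    -> tensor<128x64xf32> attributes {hdnn.device = "gpu"} {
  // Changing "gpu" to "cpu" or "tpu" would retarget the whole
  // program; the ops themselves stay unchanged, and each op can be
  // lowered to a vendor-optimized library call for that device.
  %0 = "hdnn.matmul"(%a, %b)
      : (tensor<128x256xf32>, tensor<256x64xf32>) -> tensor<128x64xf32>
  %1 = "hdnn.relu"(%0) : (tensor<128x64xf32>) -> tensor<128x64xf32>
  return %1 : tensor<128x64xf32>
}
```

Keeping the device choice in a single attribute, rather than scattered through kernel code, is what would make such a program portable across the three backends while still dispatching to optimized libraries for the performance-critical ops.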


Similar articles

Cystoscopy Image Classification Using Deep Convolutional Neural Networks

In the past three decades, the use of smart methods in medical diagnostic systems has attracted the attention of many researchers. However, no smart activity has been provided in the field of medical image processing for diagnosis of bladder cancer through cystoscopy images despite the high prevalence in the world. In this paper, two well-known convolutional neural networks (CNNs) ...


A representer theorem for deep neural networks

We propose to optimize the activation functions of a deep neural network by adding a corresponding functional regularization to the cost function. We justify the use of a second-order total-variation criterion. This allows us to derive a general representer theorem for deep neural networks that makes a direct connection with splines and sparsity. Specifically, we show that the optimal network c...


Cross-Lingual Pronoun Prediction with Deep Recurrent Neural Networks

In this paper we present our winning system in the WMT16 Shared Task on Cross-Lingual Pronoun Prediction, where the objective is to predict a missing target language pronoun based on the target and source sentences. Our system is a deep recurrent neural network, which reads both the source language and target language context with a softmax layer making the final prediction. Our system achieves ...


Training Deep Convolutional Neural Networks with Resistive Cross-Point Devices

In a previous work we have detailed the requirements for obtaining maximal deep learning performance benefit by implementing fully connected deep neural networks (DNN) in the form of arrays of resistive devices. Here we extend the concept of Resistive Processing Unit (RPU) devices to convolutional neural networks (CNNs). We show how to map the convolutional layers to fully connected RPU arrays ...


Evolving Deep Neural Networks

The success of deep learning depends on finding an architecture to fit the task. As deep learning has scaled up to more challenging tasks, the architectures have become difficult to design by hand. This paper proposes an automated method, CoDeepNEAT, for optimizing deep learning architectures through evolution. By extending existing neuroevolution methods to topology, components, and hyperparameters,...



Journal

Journal title: The Journal of Supercomputing

Year: 2022

ISSN: 0920-8542, 1573-0484

DOI: https://doi.org/10.1007/s11227-022-04417-3